Structured BFGS Method for Optimal Doubly Stochastic Matrix Approximation
Authors
Abstract
Doubly stochastic matrices play an essential role in several areas such as statistics and machine learning. In this paper we consider the optimal approximation of a square matrix in the set of doubly stochastic matrices. A structured BFGS method is proposed to solve the dual of the primal problem. The resulting algorithm builds the curvature information into the diagonal components of the true Hessian, so that it takes only an additional linear cost to obtain the descent direction from the gradient, without explicitly storing the inverse Hessian approximation; this is substantially cheaper than the quadratic complexity of the classical BFGS algorithm. Meanwhile, a Newton-based line search is presented for finding a suitable step size, which in practice uses the existing knowledge and takes only one iteration. The global convergence of our algorithm is established. We verify the advantages of our approach on both synthetic and real data sets. The experimental results demonstrate that the proposed approach outperforms state-of-the-art solvers and enjoys outstanding scalability.
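To make the problem concrete, here is a minimal sketch under assumptions not stated in the abstract: the approximation error is measured in the Frobenius norm, the feasible set is the doubly stochastic polytope (nonnegative entries, unit row and column sums), and the smooth dual in the row/column multipliers is solved with SciPy's off-the-shelf L-BFGS-B as a stand-in for the structured BFGS update and Newton-based line search proposed in the paper. Function and variable names are illustrative only.

# Illustrative sketch (not the paper's implementation): project a square matrix A
# onto the doubly stochastic matrices in the Frobenius norm by solving the smooth
# dual in the row/column multipliers (alpha, beta).
import numpy as np
from scipy.optimize import minimize

def nearest_doubly_stochastic(A):
    n = A.shape[0]

    def dual(z):
        alpha, beta = z[:n], z[n:]
        # Primal matrix implied by the dual variables: X = max(A + alpha 1^T + 1 beta^T, 0)
        X = np.maximum(A + alpha[:, None] + beta[None, :], 0.0)
        # Dual objective (up to an additive constant) and its gradient,
        # i.e. the row-sum and column-sum residuals of X.
        f = 0.5 * np.sum(X * X) - alpha.sum() - beta.sum()
        g = np.concatenate([X.sum(axis=1) - 1.0, X.sum(axis=0) - 1.0])
        return f, g

    res = minimize(dual, np.zeros(2 * n), jac=True, method="L-BFGS-B")
    alpha, beta = res.x[:n], res.x[n:]
    return np.maximum(A + alpha[:, None] + beta[None, :], 0.0)

# Example: the result has (approximately) unit row and column sums and nonnegative entries.
A = np.random.default_rng(0).random((5, 5))
X = nearest_doubly_stochastic(A)
print(abs(X.sum(axis=0) - 1).max(), abs(X.sum(axis=1) - 1).max())

Working in the 2n dual variables rather than the n^2 primal entries is what makes a low-cost quasi-Newton step plausible; per the abstract, the paper's structured BFGS goes further by building curvature information into the diagonal components of the true Hessian.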
Similar resources
Stochastic algorithms for solving structured low-rank matrix approximation problems
In this paper, we investigate the complexity of the numerical construction of the Hankel structured low-rank approximation (HSLRA) problem, and develop a family of algorithms to solve this problem. Briefly, HSLRA is the problem of finding the closest (in some pre-defined norm) rank r approximation of a given Hankel matrix, which is also of Hankel structure. We demonstrate that finding optimal s...
An asymptotic approximation for the permanent of a doubly stochastic matrix
A determinantal approximation is obtained for the permanent of a doubly stochastic matrix. For moderate-deviation matrix sequences, the asymptotic relative error is of order O(n^{-1}). Keywords: Doubly stochastic Dirichlet distribution; Maximum-likelihood projection; Sinkhorn projection
Matrix Conditioning and Adaptive Simultaneous Perturbation Stochastic Approximation Method
This paper proposes a modification to the simultaneous perturbation stochastic approximation (SPSA) methods based on the comparisons made between the first order and the second order SPSA (1SPSA and 2SPSA) algorithms from the perspective of the loss function Hessian. At finite iterations, the convergence rate depends on the matrix conditioning of the loss function Hessian. It is shown that ...
Low-Rank Doubly Stochastic Matrix Decomposition for Cluster Analysis
Cluster analysis by nonnegative low-rank approximations has experienced a remarkable progress in the past decade. However, the majority of such approximation approaches are still restricted to nonnegative matrix factorization (NMF) and suffer from the following two drawbacks: 1) they are unable to produce balanced partitions for large-scale manifold data which are common in real-world clusterin...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i6.25877